# Academic research optimization

## GaMS-9B-Instruct
GaMS-9B-Instruct is a Slovenian generation model based on Google's Gemma 2 series, supporting Slovenian, English, and partially Croatian, Serbian, and Bosnian, with a focus on text generation tasks.
Tags: Large Language Model · Safetensors · Supports Multiple Languages
Publisher: cjvt · Downloads: 1,652 · Likes: 2
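
Since this entry describes a Gemma-2-based instruct model for Slovenian text generation, a minimal loading sketch with the Hugging Face transformers text-generation pipeline is shown below. The repository id `cjvt/GaMS-9B-Instruct` and the example prompt are assumptions inferred from the listing, not details it states.

```python
# Minimal sketch, assuming the checkpoint is published as "cjvt/GaMS-9B-Instruct"
# and follows the usual Gemma-2 chat format.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="cjvt/GaMS-9B-Instruct",  # assumed repo id inferred from the listing
    device_map="auto",              # needs `accelerate`; a 9B model realistically needs a GPU
)

# Chat-style input; the Slovenian prompt is only an illustration.
messages = [{"role": "user", "content": "Na kratko opiši Ljubljano."}]
result = generator(messages, max_new_tokens=128)
print(result[0]["generated_text"])
```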

## BritLLM 3B v0.1
BritLLM is a raw pre-trained base model that supports multiple languages native to the UK and is suitable for various natural language processing tasks, but it requires further fine-tuning for most usage scenarios.
Tags: Large Language Model · Transformers · Supports Multiple Languages
Publisher: britllm · Downloads: 151 · Likes: 5

## DiscoLM 70B GGUF
DiscoLM 70B is a German-optimized large language model based on LeoLM 70B, supporting English and German, suitable for text generation tasks.
Tags: Large Language Model · Supports Multiple Languages
Publisher: TheBloke · Downloads: 2,122 · Likes: 4
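
Because this is a GGUF conversion, it targets llama.cpp-style runtimes rather than transformers. Below is a minimal sketch with the llama-cpp-python bindings; the quantized filename is a placeholder, since the listing does not name a specific file, and the repository id `TheBloke/DiscoLM-70B-GGUF` is inferred from the entry.

```python
# Minimal sketch for running a GGUF checkpoint via llama-cpp-python.
# "discolm-70b.Q4_K_M.gguf" is a placeholder; download an actual quantization
# from the (inferred) TheBloke/DiscoLM-70B-GGUF repository first.
from llama_cpp import Llama

llm = Llama(
    model_path="discolm-70b.Q4_K_M.gguf",  # local path to the downloaded GGUF file
    n_ctx=4096,                            # context window
    n_gpu_layers=-1,                       # offload all layers to GPU if one is available
)

out = llm("Erkläre in zwei Sätzen, was ein Sprachmodell ist.", max_tokens=128)
print(out["choices"][0]["text"])
```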

## OrionStar Yi 34B Chat Llama
An open-source dialogue model fine-tuned from the Yi-34B base model on 15 million high-quality corpus entries, aimed at delivering an excellent interactive experience.
License: Apache-2.0
Tags: Large Language Model · Transformers
Publisher: OrionStarAI · Downloads: 95 · Likes: 14

## LLM-Embedder Math
An LLM-Embedder model fine-tuned on mathematical datasets, focused on sentence-similarity tasks.
Tags: Text Embedding · Transformers
Publisher: horychtom · Downloads: 19 · Likes: 0
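
As a sentence-similarity embedder, this model can be exercised with a generic encode-and-compare loop. The sketch below assumes the repository id `horychtom/llm-embedder-math` and plain CLS-token pooling; both are guesses from the listing, so check the model card for the intended pooling strategy and any instruction prefixes the base LLM-Embedder expects.

```python
# Minimal sketch: sentence similarity with an assumed BERT-style encoder.
import torch
from transformers import AutoModel, AutoTokenizer

repo = "horychtom/llm-embedder-math"  # assumed repo id inferred from the listing
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo).eval()

def embed(texts):
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state
    vecs = hidden[:, 0]  # CLS-token embedding (an assumption; see lead-in)
    return torch.nn.functional.normalize(vecs, dim=-1)

a, b = embed([
    "Integrate x^2 over the interval [0, 1].",
    "What is the area under x squared between 0 and 1?",
])
print(float(a @ b))  # cosine similarity of the two sentences
```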

## Camel-Platypus2-70B
Camel-Platypus2-70B is a large language model merged from Platypus2-70B and qCammel-70-x, based on the LLaMA 2 architecture, focusing on STEM and logical reasoning tasks.
Tags: Large Language Model · Transformers · English
Publisher: garage-bAInd · Downloads: 114 · Likes: 15

## Platypus2-70B-instruct
Platypus2-70B-instruct is a large language model based on the LLaMA 2 architecture, created by merging models from garage-bAInd and upstageAI, focusing on instruction following and logical reasoning tasks.
Tags: Large Language Model · Transformers · English
Publisher: garage-bAInd · Downloads: 1,332 · Likes: 175

## BERT Tiny Chinese WS
A tiny Chinese BERT model developed by Academia Sinica's CKIP team, suitable for Traditional Chinese natural language processing tasks.
License: GPL-3.0
Tags: Sequence Labeling · Transformers · Chinese
Publisher: ckiplab · Downloads: 1,458 · Likes: 1

## BERT Tiny Chinese
A tiny Traditional Chinese BERT model from the Academia Sinica CKIP team, released as part of a family of Transformers models (ALBERT, BERT, GPT-2) and accompanying natural language processing tools.
License: GPL-3.0
Tags: Large Language Model · Transformers · Chinese
Publisher: ckiplab · Downloads: 689 · Likes: 7

## ALBERT Tiny Chinese WS
A tiny ALBERT model for Traditional Chinese word segmentation, part of the Academia Sinica CKIP team's collection of Traditional Chinese Transformers models and natural language processing tools.
License: GPL-3.0
Tags: Sequence Labeling · Transformers · Chinese
Publisher: ckiplab · Downloads: 166.28k · Likes: 6

## ALBERT Base Chinese WS
A Traditional Chinese natural language processing model developed by the Academia Sinica CKIP team, based on the ALBERT architecture, supporting tasks such as word segmentation and part-of-speech tagging.
License: GPL-3.0
Tags: Sequence Labeling · Transformers · Chinese
Publisher: ckiplab · Downloads: 1,498 · Likes: 1
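
The "WS" checkpoints above are token-classification models whose B/I labels mark Traditional Chinese word boundaries. A minimal sketch with the transformers token-classification pipeline follows; CKIP's model cards recommend pairing these checkpoints with the `bert-base-chinese` tokenizer, and the example sentence is illustrative only.

```python
# Minimal sketch: Traditional Chinese word segmentation with a CKIP "-ws" checkpoint.
from transformers import AutoModelForTokenClassification, BertTokenizerFast, pipeline

# CKIP's model cards recommend the bert-base-chinese tokenizer for these checkpoints.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = AutoModelForTokenClassification.from_pretrained("ckiplab/albert-tiny-chinese-ws")

ws = pipeline("token-classification", model=model, tokenizer=tokenizer)
for tok in ws("中央研究院開發了繁體中文的自然語言處理工具。"):
    print(tok["word"], tok["entity"])  # "B" starts a new word, "I" continues one
```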

## IT5 Base Wiki Summarization
An Italian Wikipedia summarization model fine-tuned on the WITS dataset, capable of generating concise summaries of Italian text.
License: Apache-2.0
Tags: Text Generation · Other
Publisher: gsarti · Downloads: 18 · Likes: 0
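
For the summarization entry, a short sketch with the transformers summarization pipeline is shown below; the repository id `gsarti/it5-base-wiki-summarization` and the Italian sample text are assumptions drawn from the listing.

```python
# Minimal sketch, assuming the checkpoint is published as
# "gsarti/it5-base-wiki-summarization" (a T5-style seq2seq model).
from transformers import pipeline

summarizer = pipeline("summarization", model="gsarti/it5-base-wiki-summarization")

testo = (
    "La Torre di Pisa è il campanile della cattedrale di Pisa. "
    "È famosa in tutto il mondo per la sua caratteristica pendenza, "
    "dovuta a un cedimento del terreno avvenuto nelle prime fasi della costruzione."
)
print(summarizer(testo, max_length=64, min_length=8)[0]["summary_text"])
```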